26 research outputs found

    Sparse Image Reconstruction in Computed Tomography

    Noise Robustness of a Combined Phase Retrieval and Reconstruction Method for Phase-Contrast Tomography

    Classical reconstruction methods for phase-contrast tomography consist of two stages: phase retrieval and tomographic reconstruction. A novel algebraic method combining the two was suggested by Kostenko et al. (Opt. Express, 21, 12185, 2013), and preliminary results were given demonstrating improved reconstruction compared to a two-stage method. Using simulated free-space propagation experiments with a single sample-detector distance, we thoroughly compare the novel method with the two-stage method to address limitations of the preliminary results. We demonstrate that the novel method is substantially more robust towards noise; our simulations point to a possible reduction in counting times by an order of magnitude.

    Testable uniqueness conditions for empirical assessment of undersampling levels in total variation-regularized X-ray CT

    We study recoverability in fan-beam computed tomography (CT) with sparsity and total variation priors: how many underdetermined linear measurements suffice for recovering images of given sparsity? Results from compressed sensing (CS) establish such conditions for, e.g., random measurements, but not for CT. Recoverability is typically tested by checking whether a computed solution recovers the original. This approach cannot guarantee solution uniqueness, and the recoverability decision therefore depends on the optimization algorithm. We propose new computational methods to test recoverability by verifying solution uniqueness conditions. Using both reconstruction and uniqueness testing, we empirically study the number of CT measurements sufficient for recovery on new classes of sparse test images. We demonstrate an average-case relation between sparsity and sufficient sampling and observe a sharp phase transition as known from CS, but never established for CT. In addition to assessing recoverability more reliably, we show that uniqueness tests are often the faster option. Comment: 18 pages, 7 figures, submitted

    Stopping Rules for Algebraic Iterative Reconstruction Methods in Computed Tomography

    Algebraic models for the reconstruction problem in X-ray computed tomography (CT) provide a flexible framework that applies to many measurement geometries. For large-scale problems we need to use iterative solvers, and we need stopping rules for these methods that terminate the iterations when we have computed a satisfactory reconstruction that balances the reconstruction error and the influence of noise from the measurements. Many such stopping rules have been developed in the inverse problems community, but they have not received much attention in the CT world. The goal of this paper is to describe and illustrate four stopping rules that are relevant for CT reconstructions. Comment: 11 pages, 10 figures
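    A stopping rule of the kind the abstract describes can be sketched on a toy problem. The code below applies the discrepancy principle (one of the classical rules from the inverse problems literature, not necessarily one of the paper's four) to plain Landweber iteration on a small random system; the matrix, noise level, and safety factor `tau` are illustrative assumptions, not taken from the paper.

    ```python
    import numpy as np

    # Toy sketch: stop an algebraic iterative solver when the residual
    # drops to the (assumed known) noise level -- the discrepancy principle.
    rng = np.random.default_rng(0)
    m, n = 200, 50
    A = rng.standard_normal((m, n))        # stand-in for a CT system matrix
    x_true = rng.standard_normal(n)
    noise = 0.01 * rng.standard_normal(m)
    b = A @ x_true + noise

    eta = np.linalg.norm(noise)            # noise level, assumed known here
    tau = 1.02                             # safety factor slightly above 1
    omega = 1.0 / np.linalg.norm(A, 2) ** 2  # step size ensuring convergence

    x = np.zeros(n)
    for k in range(10_000):
        residual = b - A @ x
        if np.linalg.norm(residual) <= tau * eta:
            break                          # discrepancy principle: terminate
        x = x + omega * A.T @ residual     # Landweber update

    print("stopped at iteration", k)
    ```

    Iterating past this point would mostly fit the noise (semi-convergence); stopping early is what balances reconstruction error against noise propagation.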

    Empirical average-case relation between undersampling and sparsity in X-ray CT

    In x-ray computed tomography (CT) it is generally acknowledged that reconstruction methods exploiting image sparsity allow reconstruction from a significantly reduced number of projections. The use of such reconstruction methods is motivated by recent progress in compressed sensing (CS). However, the CS framework provides neither guarantees of accurate CT reconstruction, nor any relation between sparsity and a sufficient number of measurements for recovery, i.e., perfect reconstruction from noise-free data. We consider reconstruction through 1-norm minimization, as proposed in CS, from data obtained using a standard CT fan-beam sampling pattern. In empirical simulation studies we establish quantitatively a relation between the image sparsity and the sufficient number of measurements for recovery within image classes motivated by tomographic applications. We show empirically that the specific relation depends on the image class and in many cases exhibits a sharp phase transition as seen in CS, i.e., images of the same sparsity require the same number of projections for recovery. Finally we demonstrate that th
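    The recovery test underlying such empirical studies can be sketched in a few lines. The paper samples with a CT fan-beam matrix; the sketch below substitutes a Gaussian matrix for brevity and solves the 1-norm minimization as a linear program via the standard split x = u - v with u, v >= 0. The problem sizes are illustrative assumptions only.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Sketch of a single recovery trial: can 1-norm minimization recover an
    # s-sparse signal from m < n linear measurements? (Gaussian A here, not
    # the fan-beam sampling matrix used in the paper.)
    rng = np.random.default_rng(1)
    n, m, s = 60, 30, 5                    # signal size, measurements, sparsity
    x_true = np.zeros(n)
    support = rng.choice(n, s, replace=False)
    x_true[support] = rng.standard_normal(s)
    A = rng.standard_normal((m, n))
    b = A @ x_true

    # min ||x||_1  s.t.  A x = b,  rewritten as an LP over (u, v) >= 0
    c = np.ones(2 * n)
    A_eq = np.hstack([A, -A])
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
    x_rec = res.x[:n] - res.x[n:]

    print("recovery error:", np.linalg.norm(x_rec - x_true))
    ```

    Repeating such trials over a grid of (m, s) and recording the success rate is how the sharp phase transition mentioned in the abstract is typically mapped out empirically.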

    Effect of sparsity and exposure on total variation regularized X-ray tomography from few projections

    We address effects of exposure and image gradient sparsity for total variation-regularized reconstruction: is it better to collect many low-quality or few high-quality projections, and can gradient sparsity predict how many projections are necessary? Preliminary results suggest that collecting many low-quality projections is favorable, and that a link may exist between gradient sparsity level and successful reconstruction.

    SparseBeads data: benchmarking sparsity-regularized computed tomography

    Sparsity regularization (SR) such as total variation (TV) minimization allows accurate image reconstruction in x-ray computed tomography (CT) from fewer projections than analytical methods. Exactly how few projections suffice and how this number may depend on the image remain poorly understood. Compressive sensing connects the critical number of projections to the image sparsity but does not cover CT; however, empirical results suggest a similar connection. The present work establishes for real CT data a connection between gradient sparsity and the sufficient number of projections for accurate TV-regularized reconstruction. A collection of 48 x-ray CT datasets called SparseBeads was designed for benchmarking SR reconstruction algorithms. Beadpacks comprising glass beads of five different sizes as well as mixtures were scanned in a micro-CT scanner to provide structured datasets with variable image sparsity levels, number of projections and noise levels to allow the systematic assessment of parameters affecting performance of SR reconstruction algorithms. Using the SparseBeads data, TV-regularized reconstruction quality was assessed as a function of numbers of projections and gradient sparsity. The critical number of projections for satisfactory TV-regularized reconstruction increased almost linearly with the gradient sparsity. This establishes a quantitative guideline from which one may predict how few projections to acquire based on expected sample sparsity level, as an aid in planning of dose- or time-critical experiments. The results are expected to hold for samples of similar characteristics, i.e. consisting of few, distinct phases with relatively simple structure. Such cases are plentiful in porous media, composite materials, foams, as well as non-destructive testing and metrology. For samples of other characteristics the proposed methodology may be used to investigate similar relations.
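    The gradient sparsity that the guideline is built on can be computed directly from an image. A minimal sketch, assuming gradient sparsity means the number of nonzero entries in the discrete finite-difference gradient (the exact convention may differ from the paper's): for a piecewise-constant sample, this counts pixels on phase boundaries, which is why few-phase beadpack images are gradient-sparse.

    ```python
    import numpy as np

    def gradient_sparsity(img, tol=1e-12):
        """Count nonzero entries of the discrete image gradient."""
        dx = np.diff(img, axis=0)          # vertical finite differences
        dy = np.diff(img, axis=1)          # horizontal finite differences
        return int(np.count_nonzero(np.abs(dx) > tol)
                   + np.count_nonzero(np.abs(dy) > tol))

    # Simple two-phase test image: a square inclusion in a constant background.
    # Only the edges of the square contribute to the gradient.
    img = np.zeros((64, 64))
    img[16:48, 16:48] = 1.0
    print(gradient_sparsity(img))          # -> 128 (4 edges of 32 pixels each)
    ```

    Plugging such a sparsity estimate into the near-linear relation reported above is the intended planning step: predict the required projection count before acquiring dose- or time-critical scans.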